Learning Finite Beta-Liouville Mixture Models via Variational Bayes for Proportional Data Clustering
Authors
Abstract
During the past decade, finite mixture modeling has become a well-established technique in data analysis and clustering. This paper focuses on developing a variational inference framework to learn finite Beta-Liouville mixture models, which have recently been proposed as an efficient way to cluster proportional data. In contrast to the conventional expectation-maximization (EM) algorithm commonly used for learning finite mixture models, the proposed algorithm is more efficient from a computational point of view and prevents both over- and under-fitting. Moreover, the complexity of the mixture model (i.e., the number of components) can be determined automatically, in closed form and simultaneously with parameter estimation, as part of the Bayesian inference procedure. The merits of the proposed approach are shown using both artificial data sets and two interesting and challenging real applications, namely dynamic texture clustering and facial expression recognition.
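The abstract's central claim — that variational Bayes can determine the number of mixture components automatically during fitting, unlike EM — can be illustrated with a small sketch. Beta-Liouville mixtures are not available in standard libraries, so this example uses scikit-learn's `BayesianGaussianMixture` as an analogous variational-Bayes mixture learner; the Dirichlet-distributed toy data, the component cap of 10, and the 0.05 weight threshold are illustrative choices, not values from the paper.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Toy proportional data: each row is non-negative and sums to one
# (a point on the probability simplex), drawn from two Dirichlet clusters.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.dirichlet([10, 2, 2], size=150),  # cluster 1
    rng.dirichlet([2, 10, 2], size=150),  # cluster 2
])
# Drop the redundant last coordinate (rows sum to 1, so it is determined
# by the others) to avoid a degenerate covariance during fitting.
X = X[:, :2]

# Variational Bayes fit: n_components is only an upper bound, because
# components whose posterior weights shrink toward zero are effectively
# pruned — model order is selected during inference, not by running EM
# separately for each candidate number of components.
vbgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that retain non-negligible posterior weight.
effective = int(np.sum(vbgmm.weights_ > 0.05))
print(effective)
```

On data with two well-separated clusters, the retained-component count stays far below the cap of 10, which is the behavior the abstract contrasts with EM's need for external model-selection criteria.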
Similar Papers
Variational Learning for Finite Inverted Dirichlet Mixture Models and Its Applications
Parisa Tirdad. Clustering is an important step in data mining, machine learning, computer vision, and image processing. It is the process of assigning similar objects to the same subset. Among available clustering techniques, finite mixture models have been widely used, since they have the ability to consid...
Variational Deep Embedding: An Unsupervised and Generative Approach to Clustering
Clustering is among the most fundamental tasks in computer vision and machine learning. In this paper, we propose Variational Deep Embedding (VaDE), a novel unsupervised generative clustering approach within the framework of Variational Auto-Encoder (VAE). Specifically, VaDE models the data generative procedure with a Gaussian Mixture Model (GMM) and a deep neural network (DNN): 1) the GMM pick...
Online Data Clustering Using Variational Learning of a Hierarchical Dirichlet Process Mixture of Dirichlet Distributions
This paper proposes an online clustering approach based on both hierarchical Dirichlet processes and Dirichlet distributions. The deployment of hierarchical Dirichlet processes resolves the model-selection difficulties that arise when the number of mixture components is unknown, thanks to their nonparametric nature. The consideration of the Dirichlet distribution is justified b...
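The snippet above motivates the Dirichlet distribution for this kind of data; a quick numerical check makes the point concrete: every Dirichlet draw is non-negative and sums to one, i.e., it lies on the probability simplex, which is exactly the constraint proportional data satisfy. The concentration parameters below are illustrative only.

```python
import numpy as np

# Draw samples from a 3-dimensional Dirichlet distribution.
rng = np.random.default_rng(42)
alpha = np.array([4.0, 2.0, 1.0])   # illustrative concentration parameters
samples = rng.dirichlet(alpha, size=5)

# Each row is a valid vector of proportions: non-negative, summing to 1.
print(samples.sum(axis=1))
# The distribution's mean is alpha normalized to sum to 1.
print(alpha / alpha.sum())
```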
Variational Bayes for Hierarchical Mixture Models
In recent years, sparse classification problems have emerged in many fields of study. Finite mixture models have been developed to facilitate Bayesian inference where parameter sparsity is substantial. Classification with finite mixture models is based on the posterior expectation of latent indicator variables. These quantities are typically estimated using the expectation-maximization (EM) alg...
Clustering on a Subspace of Exponential Family Using Variational Bayes Method
The e-PCA has been proposed to reduce the dimension of the parameters of probability distributions using Kullback information as a distance between two distributions. It also provides a framework for dealing with various data types such as binary and integer for which the Gaussian assumption on the data distribution is inappropriate. In this paper, we introduce a latent variable model for the e...